Deep neural network compression algorithm based on combined dynamic pruning
ZHANG Mingming, LU Qingning, LI Wenzhong, SONG Hu
Journal of Computer Applications    2021, 41 (6): 1589-1596.   DOI: 10.11772/j.issn.1001-9081.2020121914
Abstract
As a branch of model compression, network pruning reduces computational cost by removing unimportant parameters from a deep neural network. However, permanent pruning causes irreversible loss of model capacity. To address this issue, a combined dynamic pruning algorithm was proposed that analyzes both the convolution kernels and the input images. During training, part of the convolution kernels were zeroized but remained updatable until the network converged; the kernels still zeroized at that point were then permanently removed. At the same time, the input images were sampled to extract features, and a channel importance prediction network analyzed these features to determine which channels could be skipped during the convolution operation. Experimental results on M-CifarNet and VGG16 show that combined dynamic pruning achieves floating-point operation compression ratios of 2.11 and 1.99 respectively, with accuracy losses of less than 0.8 and 1.2 percentage points compared to the corresponding baseline models (M-CifarNet, VGG16). Compared with existing network pruning algorithms, the combined dynamic pruning algorithm effectively reduces the FLoating-point OPerations (FLOPs) and the parameter scale of the model, and achieves higher accuracy at the same compression ratio.
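The soft, trainable zeroization step described in the abstract can be illustrated with a short sketch. The following is a minimal, hypothetical PyTorch example; the L1-norm importance criterion, the fixed pruning ratio, and the function name are illustrative assumptions, not the authors' exact implementation.

# A minimal sketch of soft kernel zeroization, assuming an L1-norm
# importance score and a fixed pruning ratio (both are assumptions,
# not the paper's exact criterion).
import torch
import torch.nn as nn

def zeroize_kernels(conv: nn.Conv2d, prune_ratio: float = 0.3) -> torch.Tensor:
    """Zeroize the least important output kernels of a conv layer but
    keep them trainable, so later gradient updates can revive them."""
    with torch.no_grad():
        # One importance score per output filter: L1 norm of its weights.
        importance = conv.weight.abs().sum(dim=(1, 2, 3))
        n_prune = int(prune_ratio * conv.out_channels)
        _, prune_idx = importance.topk(n_prune, largest=False)
        mask = torch.ones(conv.out_channels, device=conv.weight.device)
        mask[prune_idx] = 0.0
        # Soft pruning: weights are set to zero, not removed, so they
        # still receive gradients in the next training step.
        conv.weight.mul_(mask.view(-1, 1, 1, 1))
    return mask

Because the masked weights remain ordinary parameters, a zeroized kernel can regain nonzero values during subsequent training; only kernels that are still zero once the network has converged would be physically removed, which is what distinguishes this dynamic scheme from permanent pruning.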
Novel survival of the fittest shuffled frog leaping algorithm with normal mutation
ZHANG Mingming, DAI Yueming, WU Dinghui
Journal of Computer Applications    2016, 36 (6): 1583-1587.   DOI: 10.11772/j.issn.1001-9081.2016.06.1583
Abstract
To overcome the demerits of the basic Shuffled Frog Leaping Algorithm (SFLA), such as slow convergence, low optimization precision, and a tendency to fall into local optima, a novel survival-of-the-fittest SFLA with normal mutation was proposed. In the local search strategy, a normal-mutation update for the worst frog individuals in each subgroup was introduced, which effectively prevents the algorithm from falling into local convergence, expands the search space, and increases population diversity. Meanwhile, mutation was applied selectively to a small number of the worse frog individuals in each subgroup, so that useful mutations were inherited and harmful ones discarded; this survival-of-the-fittest selection improves population quality, reduces the blindness of the optimization process, and speeds up the search. An elite mutation mechanism for the best frog individual in each subgroup was further introduced to obtain better individuals, enhancing the global optimization ability of the algorithm, avoiding local convergence, and guiding the whole population toward better solutions. The results of 30 independent runs indicate that the proposed algorithm converges to the optimal value of 0 on the Sphere, Rastrigin, Griewank, Ackley, and Quadric functions, outperforming the comparison algorithms. The experimental results show that the proposed algorithm effectively avoids premature convergence and improves both convergence speed and convergence precision.
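The worst-frog update with normal mutation can be sketched as follows. This is a minimal NumPy illustration assuming a minimization problem; the mutation scale sigma, the uniform leap factor, and the greedy acceptance rule are assumptions for exposition, not the paper's exact formulation.

# A minimal sketch of the normal-mutation update for the worst frog in a
# memeplex. The sigma value and acceptance rule are illustrative assumptions.
import numpy as np

def update_worst_frog(x_worst, x_best, fitness, sigma=0.1, rng=None):
    """Leap the worst frog toward the memeplex best, apply a normal
    (Gaussian) mutation, and keep the result only if it improves."""
    rng = np.random.default_rng() if rng is None else rng
    # Standard SFLA leap toward the local best frog.
    candidate = x_worst + rng.random(x_worst.shape) * (x_best - x_worst)
    # Normal mutation to escape local optima and diversify the search.
    candidate = candidate + rng.normal(0.0, sigma, size=x_worst.shape)
    # Survival of the fittest: accept the candidate only if it is fitter
    # (lower objective value, assuming minimization).
    return candidate if fitness(candidate) < fitness(x_worst) else x_worst

The greedy acceptance step is what realizes "survival of the fittest" here: a mutated frog replaces the original only when it is strictly better, so useful mutations propagate while harmful ones are discarded.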